Enhanced Nonconvex Low-Rank Approximation of Tensor Multi-Modes for Tensor Completion
Authors
Abstract
Higher-order low-rank tensors arise in many data processing applications and have attracted great interest. Inspired by low-rank approximation theory, researchers have proposed a series of effective tensor completion methods. However, most of these methods directly consider the global low-rankness of the underlying tensor, which is not sufficient at low sampling rates; in addition, a single nuclear norm, or its relaxation, is usually adopted to approximate the rank function, which leads to a suboptimal solution that deviates from the original one. To alleviate these problems, in this paper we propose a novel low-rank approximation of tensor multi-modes (LRATM), in which a double nonconvex L_γ norm is designed to represent the joint manifold drawn from the factorization factors of each mode of the underlying tensor. A block successive upper-bound minimization (BSUM) based algorithm is designed to efficiently solve the proposed model, and it can be demonstrated that our numerical scheme converges to coordinate-wise minimizers. Numerical results on three types of public multi-dimensional datasets show that the proposed method can recover a variety of low-rank tensors from significantly fewer samples than the compared methods.
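The abstract gives no implementation details, but the overall recipe it describes — exploit the low-rankness of every mode unfolding and update the estimate block by block while keeping the observed entries fixed — can be illustrated with a deliberately simplified sketch. The snippet below is not the LRATM model or the BSUM solver of the paper; it uses plain hard rank truncation per mode, and the function names (e.g. multimode_complete) are our own, introduced only to show the multi-mode completion idea.

```python
import numpy as np

def unfold(T, mode):
    """Mode-`mode` matricization: rows index the chosen mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of `unfold` for a tensor with the given full shape."""
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)

def truncate(M, r):
    """Hard rank-r truncation of a matrix via the SVD."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def multimode_complete(T_obs, mask, ranks, n_iter=100):
    """Block sweep: a low-rank surrogate for every mode unfolding, averaged,
    with the observed entries re-imposed after each sweep."""
    X = np.where(mask, T_obs, 0.0)
    for _ in range(n_iter):
        X = np.mean([fold(truncate(unfold(X, m), r), m, X.shape)
                     for m, r in enumerate(ranks)], axis=0)
        X = np.where(mask, T_obs, X)
    return X

# Usage: recover a synthetic low-rank 20x20x20 tensor from ~30% of its entries.
rng = np.random.default_rng(0)
G = np.einsum('abc,ia,jb,kc->ijk', rng.standard_normal((3, 3, 3)),
              rng.standard_normal((20, 3)), rng.standard_normal((20, 3)),
              rng.standard_normal((20, 3)))
mask = rng.random(G.shape) < 0.3
rec = multimode_complete(G, mask, ranks=(3, 3, 3))
print("relative error:", np.linalg.norm(rec - G) / np.linalg.norm(G))
```

In the paper itself, the hard truncation is replaced by the double nonconvex norm on the mode-wise factorization factors, and the block updates come with the convergence guarantee stated in the abstract.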
Similar resources
Efficient tensor completion: Low-rank tensor train
This paper proposes a novel formulation of the tensor completion problem to impute missing entries of data represented by tensors. The formulation is introduced in terms of tensor train (TT) rank, which can effectively capture global information of tensors thanks to its construction by a well-balanced matricization scheme. Two algorithms are proposed to solve the corresponding tensor completion p...
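For orientation, the TT rank referred to above is the vector of ranks of the canonical unfoldings that split the modes into a leading and a trailing group; the well-balanced splits (roughly half the modes on each side) are what let TT-based completion capture global correlations. The NumPy sketch below is our own illustration of those unfoldings and of a plain TT-SVD, not the completion algorithms proposed in the paper.

```python
import numpy as np

def tt_unfolding(T, k):
    """k-th canonical matricization: first k modes vs. the remaining ones.
    The matrix rank of this unfolding is the k-th TT rank; well-balanced
    splits (k near d/2) are what TT-based completion exploits."""
    return T.reshape(int(np.prod(T.shape[:k])), -1)

def tt_svd(T, max_rank):
    """Plain TT-SVD: a sweep of truncated SVDs producing 3-way TT cores."""
    d, shape = T.ndim, T.shape
    cores, r_prev = [], 1
    C = T.reshape(shape[0], -1)
    for n in range(d - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, shape[n], r))
        C = (s[:r, None] * Vt[:r]).reshape(r * shape[n + 1], -1)
        r_prev = r
    cores.append(C.reshape(r_prev, shape[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract the TT cores back into a full tensor."""
    X = cores[0]
    for G in cores[1:]:
        X = np.tensordot(X, G, axes=([X.ndim - 1], [0]))
    return X[0, ..., 0]

# Sanity check: a random 4-way tensor is reproduced exactly when the
# truncation rank is not binding.
T = np.random.default_rng(0).standard_normal((4, 5, 6, 7))
err = np.linalg.norm(tt_to_full(tt_svd(T, max_rank=100)) - T)
print("reconstruction error:", err)
```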
Low-rank Tensor Approximation
Approximating a tensor by another of lower rank is in general an ill-posed problem. Yet, this kind of approximation is mandatory in the presence of measurement errors or noise. We show how tools recently developed in compressed sensing can be used to solve this problem. More precisely, a minimal angle between the columns of loading matrices allows one to restore both existence and uniqueness of the...
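The "minimal angle between the columns of loading matrices" is, in effect, a coherence condition on the CP factor matrices: nearly collinear columns make the best low-rank approximation ill-posed, whereas a lower bound on the pairwise column angles restores existence and uniqueness. The snippet below (our own illustration; the paper's precise bound is not reproduced) shows how such an angle can be measured.

```python
import numpy as np

def min_column_angle(A):
    """Smallest pairwise angle (radians) between the columns of a loading
    matrix; larger values indicate better-conditioned factors."""
    Q = A / np.linalg.norm(A, axis=0, keepdims=True)   # unit-norm columns
    cos = np.abs(Q.T @ Q)                              # |cosines| of all pairs
    np.fill_diagonal(cos, 0.0)                         # ignore self-angles
    return np.arccos(np.clip(cos.max(), -1.0, 1.0))

# Example: well-separated columns vs. two nearly collinear columns.
rng = np.random.default_rng(1)
A_good = rng.standard_normal((50, 4))
A_bad = np.column_stack([A_good[:, 0], A_good[:, 0] + 1e-3 * A_good[:, 1]])
print(min_column_angle(A_good), min_column_angle(A_bad))
```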
Low-Rank Tensor Completion by Riemannian Optimization
In tensor completion, the goal is to fill in missing entries of a partially known tensor under a low-rank constraint. We propose a new algorithm that performs Riemannian optimization techniques on the manifold of tensors of fixed multilinear rank. More specifically, a variant of the nonlinear conjugate gradient method is developed. Paying particular attention to the efficient implementation, ou...
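The manifold in question is the set of tensors of fixed multilinear (Tucker) rank, on which the paper runs a nonlinear conjugate gradient method with tangent-space projections. The toy sketch below only mimics the spirit: a projected gradient step that retracts onto the rank constraint via a truncated HOSVD. All function names are ours, and this is not the authors' algorithm.

```python
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def ttm(T, M, mode):
    """Mode product: multiply the `mode`-unfolding of T by M from the left."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def hosvd_retract(T, ranks):
    """Truncated HOSVD: a quasi-optimal projection onto tensors of multilinear
    rank at most `ranks`, used here as a retraction-like step."""
    Us = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
          for m, r in enumerate(ranks)]
    X = T
    for m, U in enumerate(Us):
        X = ttm(X, U @ U.T, m)          # project each mode onto span(U)
    return X

def complete(T_obs, mask, ranks, step=1.0, n_iter=200):
    """Projected-gradient caricature of manifold-based tensor completion."""
    X = hosvd_retract(np.where(mask, T_obs, 0.0), ranks)
    for _ in range(n_iter):
        grad = np.where(mask, X - T_obs, 0.0)   # gradient of the data-fit term
        X = hosvd_retract(X - step * grad, ranks)
    return X

# Usage on a synthetic multilinear rank-(2,2,2) tensor observed at 40% of entries.
rng = np.random.default_rng(0)
G = np.einsum('abc,ia,jb,kc->ijk', rng.standard_normal((2, 2, 2)),
              rng.standard_normal((15, 2)), rng.standard_normal((15, 2)),
              rng.standard_normal((15, 2)))
mask = rng.random(G.shape) < 0.4
X = complete(G, mask, ranks=(2, 2, 2))
print("relative error:", np.linalg.norm(X - G) / np.linalg.norm(G))
```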
Cross: Efficient Low-rank Tensor Completion
The completion of tensors, or high-order arrays, has attracted significant attention in recent research. Current literature on tensor completion primarily focuses on recovery from a set of uniformly randomly measured entries, and the required number of measurements to achieve recovery is not guaranteed to be optimal. In addition, the implementations of some previous methods are NP-hard. In this artic...
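The departure here from most completion work is that entries are not observed uniformly at random but along designed index sets (fibers and subtensors) from which the whole tensor is rebuilt. The matrix analogue of this idea is the classical skeleton (cross/CUR) decomposition, sketched below purely as an illustration; the paper's tensor construction and its sample-complexity guarantees are more involved.

```python
import numpy as np

rng = np.random.default_rng(3)
# Matrix skeleton (cross) sketch: sample a few rows and columns and rebuild
# the full matrix from them; the tensor method generalizes this idea to
# carefully chosen subtensors instead of uniformly random entries.
A = rng.standard_normal((120, 5)) @ rng.standard_normal((5, 90))   # exact rank 5
rows = rng.choice(A.shape[0], size=8, replace=False)
cols = rng.choice(A.shape[1], size=8, replace=False)
C, R, W = A[:, cols], A[rows, :], A[np.ix_(rows, cols)]
A_hat = C @ np.linalg.pinv(W) @ R            # exact when rank(W) == rank(A)
print("relative error:", np.linalg.norm(A_hat - A) / np.linalg.norm(A))
```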
Relative Error Tensor Low Rank Approximation
We consider relative error low rank approximation of tensors with respect to the Frobenius norm. Namely, given an order-q tensor A ∈ ℝ^(n_1 × n_2 × ⋯ × n_q), output a rank-k tensor B for which ‖A − B‖_F ≤ (1 + ε)·OPT, where OPT = inf_{rank-k A′} ‖A − A′‖_F. Despite much success on obtaining relative error low rank approximations for matrices, no such results were known for tensors. One structural issue is that ...
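As a concrete reading of the (1 + ε)·OPT criterion, the order-2 case is the easiest to check numerically, since there OPT is given exactly by the truncated SVD (Eckart–Young). The snippet below compares a cheap randomized rank-k approximation against that baseline; it illustrates only the error criterion, not the tensor algorithms of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
# Order-2 special case: OPT is computable via the truncated SVD.
A = rng.standard_normal((200, 40)) @ rng.standard_normal((40, 150))
A += 0.01 * rng.standard_normal(A.shape)
k = 40

U, s, Vt = np.linalg.svd(A, full_matrices=False)
OPT = np.linalg.norm(A - (U[:, :k] * s[:k]) @ Vt[:k])      # best rank-k error

# A cheap sketch-based rank-k approximation B (randomized range finder).
Q, _ = np.linalg.qr(A @ rng.standard_normal((A.shape[1], k + 10)))
Ub, sb, Vbt = np.linalg.svd(Q.T @ A, full_matrices=False)
B = Q @ ((Ub[:, :k] * sb[:k]) @ Vbt[:k])

print("achieved error / OPT:", np.linalg.norm(A - B) / OPT)  # close to 1 + eps
```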
Journal
Journal title: IEEE Transactions on Computational Imaging
Year: 2021
ISSN: 2333-9403, 2573-0436
DOI: https://doi.org/10.1109/tci.2021.3053699